A new heap game
Given k ≥ 3 heaps of tokens. The moves of the 2-player game introduced
here are to either take a positive number of tokens from at most k-1 heaps,
or to remove the {\sl same} positive number of tokens from all the heaps.
We analyse this extension of Wythoff's game and provide a polynomial-time
strategy for it. Comment: To appear in Computer Games 199
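The abstract does not spell out the strategy itself. As background only, here is a minimal sketch (the helper name is ours, not from the paper) of the classical two-heap Wythoff's game that this paper extends: its P-positions (previous-player wins) are exactly the pairs (⌊nφ⌋, ⌊nφ⌋ + n), with φ the golden ratio.

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def is_p_position(a, b):
    """Return True if the two-heap Wythoff position (a, b) is a P-position,
    i.e. a previous-player win: (a, b) = (floor(n*PHI), floor(n*PHI) + n)."""
    a, b = sorted((a, b))
    n = b - a                      # for P-positions the heap gap equals n
    return a == math.floor(n * PHI)

# First few P-positions: (0,0), (1,2), (3,5), (4,7), (6,10), ...
```

The k-heap extension analysed in the paper requires a different characterization; the sketch above only illustrates the base game.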
Fast Algorithm for Partial Covers in Words
A factor u of a word w is a cover of w if every position in w lies
within some occurrence of u in w. A word w covered by u thus
generalizes the idea of a repetition, that is, a word composed of exact
concatenations of u. In this article we introduce a new notion of
α-partial cover, which can be viewed as a relaxed variant of cover, that
is, a factor covering at least α positions in w. We develop a data
structure of size O(n) (where n = |w|) that can be constructed in O(n log n) time, which we apply to compute all shortest α-partial covers for a
given α. We also employ it for an O(n log n)-time algorithm computing
a shortest α-partial cover for each α = 1, ..., n
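The definitions above can be checked with a brute-force sketch (function names are ours; this runs in roughly O(n^4), nowhere near the paper's O(n log n) bound, and is meant only to make the notion of α-partial cover concrete):

```python
def covered_positions(w, u):
    """Set of positions of w that lie inside some occurrence of factor u."""
    covered = set()
    for i in range(len(w) - len(u) + 1):
        if w[i:i + len(u)] == u:
            covered.update(range(i, i + len(u)))
    return covered

def shortest_partial_cover(w, alpha):
    """Brute-force search for a shortest factor of w covering >= alpha
    positions, i.e. a shortest alpha-partial cover."""
    for length in range(1, len(w) + 1):
        for i in range(len(w) - length + 1):
            u = w[i:i + length]
            if len(covered_positions(w, u)) >= alpha:
                return u
    return None
```

For example, in w = "abaaba" the factor "a" already covers 4 positions, while covering all 6 positions requires the length-3 cover "aba".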
Game saturation of intersecting families
We consider the following combinatorial game: two players, Fast and Slow,
claim k-element subsets of [n] alternately, one at each turn,
such that both players are only allowed to pick sets that intersect all
previously claimed subsets. The game ends when there does not exist any
unclaimed k-subset that meets all already claimed sets. The score of the game is the
number of sets claimed by the two players, the aim of Fast is to keep the score
as low as possible, while the aim of Slow is to postpone the game's end as long
as possible. The game saturation number is the score of the game when both
players play according to an optimal strategy. To be precise we have to
distinguish two cases depending on which player takes the first move. Let
gsat_F(n,k) and gsat_S(n,k) denote the score of
the saturation game when both players play according to an optimal strategy and
the game starts with Fast's or Slow's move, respectively. We prove lower and
upper bounds on both of these quantities
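The game described above can be evaluated exactly by exhaustive minimax for very small parameters; a minimal sketch (our code, not the paper's method, and only feasible for tiny n and k):

```python
from itertools import combinations

def game_saturation(n, k, fast_starts=True):
    """Exhaustive minimax score of the saturation game on k-subsets of
    {0, ..., n-1}: players alternately claim unclaimed k-sets intersecting
    all previously claimed ones; Fast minimizes the final number of claimed
    sets, Slow maximizes it."""
    all_sets = [frozenset(c) for c in combinations(range(n), k)]

    def play(claimed, fast_to_move):
        moves = [s for s in all_sets
                 if s not in claimed and all(s & t for t in claimed)]
        if not moves:
            return len(claimed)   # no legal move: the family is saturated
        scores = [play(claimed | {s}, not fast_to_move) for s in moves]
        return min(scores) if fast_to_move else max(scores)

    return play(frozenset(), fast_starts)
```

For instance, for n = 4, k = 2 every maximal intersecting family of 2-subsets (a star or a triangle) has size 3, so the score is 3 regardless of who moves first.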
A Cauchy-Dirac delta function
The Dirac delta function has solid roots in 19th century work in Fourier
analysis and singular integrals by Cauchy and others, anticipating Dirac's
discovery by over a century, and illuminating the nature of Cauchy's
infinitesimals and his infinitesimal definition of delta. Comment: 24 pages, 2 figures; Foundations of Science, 201
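As an illustration (ours, not necessarily the exact formula discussed in the paper), singular kernels of the kind Cauchy worked with represent delta through an infinitesimal scale parameter ε:

```latex
\delta_\varepsilon(x) = \frac{1}{\pi}\,\frac{\varepsilon}{\varepsilon^2 + x^2},
\qquad
\int_{-\infty}^{\infty} \delta_\varepsilon(x)\,dx = 1,
\qquad
\int_{-\infty}^{\infty} f(x)\,\delta_\varepsilon(x)\,dx \approx f(0)
\quad \text{for infinitesimal } \varepsilon .
```

For each fixed ε this is an ordinary function; taking ε infinitesimal concentrates its unit mass at the origin, which is the sense in which such kernels anticipate Dirac's delta.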
Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, And Their Foes From Berkeley To Russell And Beyond
Many historians of the calculus deny significant continuity between
infinitesimal calculus of the 17th century and 20th century developments such
as Robinson's theory. Robinson's hyperreals, while providing a consistent
theory of infinitesimals, require the resources of modern logic; thus many
commentators are comfortable denying a historical continuity. A notable
exception is Robinson himself, whose identification with the Leibnizian
tradition inspired Lakatos, Laugwitz, and others to consider the history of the
infinitesimal in a more favorable light. In spite of his Leibnizian sympathies,
Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly
demonstrating the inconsistency of reasoning with historical infinitesimal
magnitudes. We argue that Robinson, among others, overestimates the force of
Berkeley's criticisms, by underestimating the mathematical and philosophical
resources available to Leibniz. Leibniz's infinitesimals are fictions, not
logical fictions, as Ishiguro proposed, but rather pure fictions, like
imaginaries, which are not eliminable by some syncategorematic paraphrase. We
argue that Leibniz's defense of infinitesimals is more firmly grounded than
Berkeley's criticism thereof. We show, moreover, that Leibniz's system for
differential calculus was free of logical fallacies. Our argument strengthens
the conception of modern infinitesimals as a development of Leibniz's strategy
of relating inassignable to assignable quantities by means of his
transcendental law of homogeneity. Comment: 69 pages, 3 figure
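The transcendental law of homogeneity (TLH) mentioned above can be illustrated by a standard textbook example (ours, not a quotation from Leibniz): in computing the differential of a product, the second-order term du·dv is discarded as negligible relative to the first-order terms.

```latex
d(uv) = (u + du)(v + dv) - uv
      = u\,dv + v\,du + du\,dv
\;\;\overset{\text{TLH}}{=}\;\; u\,dv + v\,du .
```

This is exactly the move from inassignable quantities (terms involving products of infinitesimals) to the assignable first-order expression that survives.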
Measurement of L-shell emission from mid-Z targets under non-LTE conditions using Transmission Grating Spectrometer and DANTE power diagnostics
In this work, we present the measurement of L-band emission from buried Sc/V targets in experiments performed at the OMEGA laser facility. The goal of these experiments was to study non-local thermodynamic equilibrium plasmas and benchmark atomic physics codes. The L-band emission was measured simultaneously by the time-resolved DANTE power diagnostic and the recently fielded time-integrated Soreq-Transmission Grating Spectrometer (TGS) diagnostic. The TGS measurement was used to support the spectral reconstruction process needed for the unfolding of the DANTE data. The Soreq-TGS diagnostic allows for broadband spectral measurement in the 120 eV–2000 eV spectral band, covering L- and M-shell emission of mid- and high-Z elements, with spectral resolution λ/Δλ = 8–30 and accuracy better than 25%. The Soreq-TGS diagnostic is compatible with ten-inch-manipulator platforms and can be used for a wide variety of high energy density physics, laboratory astrophysics, and inertial confinement fusion experiments
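The quoted band and resolving power can be restated in wavelength terms with the standard conversion λ = hc/E (hc ≈ 1239.84 eV·nm); a minimal sketch with helper names of our own:

```python
HC_EV_NM = 1239.84  # h*c in eV*nm

def ev_to_nm(energy_ev):
    """Photon energy (eV) to wavelength (nm) via lambda = hc/E."""
    return HC_EV_NM / energy_ev

def resolvable_dlambda(energy_ev, resolving_power):
    """Smallest resolvable wavelength interval (nm) at a given
    resolving power R = lambda/delta-lambda."""
    return ev_to_nm(energy_ev) / resolving_power
```

The 120 eV–2000 eV band thus corresponds to wavelengths of roughly 10.3 nm down to 0.62 nm.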
Ten Misconceptions from the History of Analysis and Their Debunking
The widespread idea that infinitesimals were "eliminated" by the "great
triumvirate" of Cantor, Dedekind, and Weierstrass is refuted by an
uninterrupted chain of work on infinitesimal-enriched number systems. The
elimination claim is an oversimplification created by triumvirate followers,
who tend to view the history of analysis as a pre-ordained march toward the
radiant future of Weierstrassian epsilontics. In the present text, we document
distortions of the history of analysis stemming from the triumvirate ideology
of ontological minimalism, which identified the continuum with a single number
system. Such anachronistic distortions characterize the received interpretation
of Stevin, Leibniz, d'Alembert, Cauchy, and others. Comment: 46 pages, 4 figures; Foundations of Science (2012). arXiv admin note:
text overlap with arXiv:1108.2885 and arXiv:1110.545
Huntington's Disease iPSC-Derived Brain Microvascular Endothelial Cells Reveal WNT-Mediated Angiogenic and Blood-Brain Barrier Deficits
Brain microvascular endothelial cells (BMECs) are an essential component of the blood-brain barrier (BBB) that shields the brain against toxins and immune cells. While BBB dysfunction exists in neurological disorders, including Huntington's disease (HD), it is not known if BMECs themselves are functionally compromised to promote BBB dysfunction. Further, the underlying mechanisms of BBB dysfunction remain elusive given limitations with mouse models and post-mortem tissue to identify primary deficits. We undertook a transcriptome and functional analysis of human induced pluripotent stem cell (iPSC)-derived BMECs (iBMEC) from HD patients or unaffected controls. We demonstrate that HD iBMECs have intrinsic abnormalities in angiogenesis and barrier properties, as well as in signaling pathways governing these processes. Thus, our findings provide an iPSC-derived BBB model for a neurodegenerative disease and demonstrate autonomous neurovascular deficits that may underlie HD pathology with implications for therapeutics and drug delivery. American Heart Association (12PRE10410000), American Heart Association (CIRMTG2-01152), National Institutes of Health (U.S.) (NIHNS089076